Almost Linear VC Dimension Bounds for Piecewise Polynomial Networks

Authors

  • Peter L. Bartlett
  • Vitaly Maiorov
  • Ron Meir
Abstract

We compute upper and lower bounds on the VC dimension and pseudo-dimension of feedforward neural networks composed of units with piecewise polynomial activation functions. We show that if the number of layers is fixed, then the VC dimension and pseudo-dimension grow as W log W, where W is the number of parameters in the network. This stands in contrast to the case where the number of layers is unbounded, in which case the VC dimension and pseudo-dimension grow as W^2. We combine our results with recently established approximation error rates and determine error bounds for the problem of regression estimation by piecewise polynomial networks with unbounded weights.
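In symbols, the rates stated above can be summarized as follows (a sketch of the abstract's claims only; the exact constants and the precise dependence on depth are given in the paper):

\[
  \mathrm{VCdim},\ \mathrm{Pdim} = \Theta(W \log W) \quad \text{(number of layers fixed)},
\]
\[
  \mathrm{VCdim},\ \mathrm{Pdim} = \Theta(W^{2}) \quad \text{(number of layers unbounded)},
\]

where W is the number of adjustable parameters in the network.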

Similar Articles

Nearly-tight VC-dimension bounds for piecewise linear neural networks

We prove new upper and lower bounds on the VC-dimension of deep neural networks with the ReLU activation function. These bounds are tight for almost the entire range of parameters. Letting W be the number of weights and L be the number of layers, we prove that the VC-dimension is O(WL log(W)), and provide examples with VC-dimension Ω(WL log(W/L)). This improves both the previously known upper ...
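In display form (a restatement of the two bounds quoted above, not additional results):

\[
  \mathrm{VCdim} = O\bigl(WL \log W\bigr)
  \qquad\text{and}\qquad
  \mathrm{VCdim} = \Omega\bigl(WL \log(W/L)\bigr).
\]

Note that for any fixed depth L, log(W/L) = Θ(log W), so the upper and lower bounds match up to constant factors.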

Tight Bounds for the VC-Dimension of Piecewise Polynomial Networks

O(ws(s log d + log(dqh/s))) and O(ws((h/s) log q) + log(dqh/s)) are upper bounds for the VC-dimension of a set of neural networks of units with piecewise polynomial activation functions, where s is the depth of the network, h is the number of hidden units, w is the number of adjustable parameters, q is the maximum of the number of polynomial segments of the activation function, and d is the max...
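Rendered as display math (a reconstruction of the plain-text formulas above, keeping their grouping):

\[
  O\Bigl(ws\bigl(s \log d + \log(dqh/s)\bigr)\Bigr)
  \quad\text{and}\quad
  O\Bigl(ws\,(h/s)\log q + \log(dqh/s)\Bigr).
\]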

Vapnik-Chervonenkis Dimension of Recurrent Neural Networks

Most of the work on the Vapnik-Chervonenkis dimension of neural networks has been focused on feedforward networks. However, recurrent networks are also widely used in learning applications, in particular when time is a relevant parameter. This paper provides lower and upper bounds for the VC dimension of such networks. Several types of activation functions are discussed, including threshold, po...

Bounds for the Computational Power and Learning Complexity of Analog Neural Nets

It is shown that feedforward neural nets of constant depth with piecewise polynomial activation functions and arbitrary real weights can be simulated for boolean inputs and outputs by neural nets of a somewhat larger size and depth with Heaviside gates and weights from {0, 1}. This provides the first known upper bound for the computational power and VC-dimension of such neural nets. It is also sh...

VC dimension bounds for networks of spiking neurons

We calculate bounds on the VC dimension and pseudo-dimension for networks of spiking neurons. The connections between network nodes are parameterized by transmission delays and synaptic weights. We provide bounds, in terms of network depth and number of connections, that are almost linear. For networks with few layers this yields better bounds than previously established results for networks of ...


Journal:
  • Neural Computation

Volume 10, Issue 8

Pages: -

Publication date: 1998